A Smooth and Locally Sparse Estimator for Functional Linear Regression via Functional SCAD Penalty
Authors
Zhenhua Lin, Department of Statistical Sciences, University of Toronto, Toronto, ON, Canada. Email: [email protected]
Jiguo Cao, Department of Statistics and Actuarial Science, Simon Fraser University, Burnaby, BC, Canada. Email: [email protected]
Liangliang Wang, Department of Statistics and Actuarial Science, Simon Fraser University, Burnaby, BC, Canada. Email: [email protected]
Haonan Wang, Department of Statistics, Colorado State University, Fort Collins, CO, U.S.A. Email: [email protected]
Abstract
Similar References
Some Perspectives of Smooth and Locally Sparse Estimators
In this thesis we develop new techniques for computing smooth and, at the same time, locally sparse (i.e., zero on some sub-regions) estimators of functional principal components (FPCs) in functional principal component analysis (FPCA) and of coefficient functions in functional linear regression (FLR). Like sparse models in ordinary data analysis, locally sparse estimators in functional data analysis e...
Locally Sparse Estimator for Functional Linear Regression Models
A new locally sparse (i.e., zero on some subregions) estimator for coefficient functions in functional linear regression models is developed based on a novel functional regularization technique called "fSCAD". The shrinkage property of fSCAD allows the proposed estimator to locate null subregions of coefficient functions without over-shrinking the nonzero values of the coefficient functions. Addi...
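To make the kind of penalized criterion involved concrete, the display below is a minimal sketch for a scalar-response functional linear model with an fSCAD-type penalty; the roughness term, its weight gamma, and the constant M/T are illustrative assumptions rather than the paper's exact formulation, and p_lambda denotes the SCAD penalty of Fan and Li (2001).

\[
y_i = \alpha + \int_0^T X_i(t)\,\beta(t)\,dt + \varepsilon_i, \qquad i = 1, \dots, n,
\]
\[
\hat{\beta} = \arg\min_{\beta} \; \sum_{i=1}^{n} \Bigl( y_i - \alpha - \int_0^T X_i(t)\,\beta(t)\,dt \Bigr)^2
 + \gamma \int_0^T \{\beta''(t)\}^2\,dt
 + \frac{M}{T} \int_0^T p_\lambda\bigl(|\beta(t)|\bigr)\,dt,
\]
where, for \(\theta \ge 0\) and a fixed constant \(a > 2\),
\[
p_\lambda(\theta) =
\begin{cases}
\lambda\,\theta, & 0 \le \theta \le \lambda,\\[2pt]
\dfrac{2a\lambda\theta - \theta^2 - \lambda^2}{2(a-1)}, & \lambda < \theta \le a\lambda,\\[2pt]
\dfrac{(a+1)\lambda^2}{2}, & \theta > a\lambda.
\end{cases}
\]

Because \(p_\lambda\) is constant for large arguments, regions where the true coefficient function is zero can be shrunk exactly to zero without biasing its large values, which is the locally sparse behaviour described in the abstract.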
Asymptotic oracle properties of SCAD-penalized least squares estimators
We study the asymptotic properties of the SCAD-penalized least squares estimator in sparse, high-dimensional linear regression models when the number of covariates may increase with the sample size. We are particularly interested in the use of this estimator for simultaneous variable selection and estimation. We show that under appropriate conditions, the SCAD-penalized least squares estimator...
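As a reference point, the estimator studied here is the minimizer of a SCAD-penalized least-squares criterion of the following form, where p_lambda is the SCAD penalty displayed above; the 1/(2n) scaling is a common convention and an assumption on my part.

\[
\hat{\beta} = \arg\min_{\beta \in \mathbb{R}^{p_n}} \; \frac{1}{2n} \sum_{i=1}^{n} \bigl( y_i - x_i^{\top}\beta \bigr)^2 + \sum_{j=1}^{p_n} p_\lambda\bigl(|\beta_j|\bigr).
\]

The oracle property referred to in the title means that, with probability tending to one, the zero coefficients are estimated as exactly zero and the nonzero coefficients are estimated as efficiently as if the true sparsity pattern were known in advance.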
Approximate message passing for nonconvex sparse regularization with stability and asymptotic analysis
We analyse a linear regression problem with a nonconvex regularization called smoothly clipped absolute deviation (SCAD) under an overcomplete Gaussian basis for Gaussian random data. We propose an approximate message passing (AMP) algorithm that accommodates this nonconvex regularization, namely SCAD-AMP, and analytically show that the stability condition corresponds to the de Almeida–Thouless condition in...
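For context, the scalar thresholding rule associated with SCAD, which an AMP-style algorithm would typically apply coordinate-wise as its denoising step, has the closed form below; whether SCAD-AMP uses exactly this rule and parameterization is an assumption, since the abstract is truncated.

\[
\hat{\theta}(z;\lambda,a) =
\begin{cases}
\operatorname{sign}(z)\,(|z| - \lambda)_{+}, & |z| \le 2\lambda,\\[2pt]
\dfrac{(a-1)z - \operatorname{sign}(z)\,a\lambda}{a-2}, & 2\lambda < |z| \le a\lambda,\\[2pt]
z, & |z| > a\lambda.
\end{cases}
\]

Unlike soft thresholding, large inputs pass through unchanged, which is the source of SCAD's reduced bias but also of the nonconvexity that makes the stability analysis in this reference nontrivial.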
SCAD-Penalized Regression in High-Dimensional Partially Linear Models
We consider the problem of simultaneous variable selection and estimation in partially linear models with a divergent number of covariates in the linear part, under the assumption that the vector of regression coefficients is sparse. We apply the SCAD penalty to achieve sparsity in the linear part and use polynomial splines to estimate the nonparametric component. Under reasonable conditions, i...
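A sketch of the criterion this abstract describes is given below, with the nonparametric component expanded in a polynomial spline basis; the symbols, the basis functions B_1, ..., B_K, their number K, and the 1/(2n) scaling are illustrative assumptions.

\[
(\hat{\beta}, \hat{\gamma}) = \arg\min_{\beta,\,\gamma} \; \frac{1}{2n} \sum_{i=1}^{n} \Bigl( y_i - x_i^{\top}\beta - \sum_{k=1}^{K} \gamma_k B_k(z_i) \Bigr)^2 + \sum_{j=1}^{p_n} p_\lambda\bigl(|\beta_j|\bigr),
\]
so that only the linear coefficients \(\beta\) are penalized by SCAD, while the nonparametric component \(g(z) \approx \sum_{k} \gamma_k B_k(z)\) is estimated by the spline part.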